IS

Munro, Malcolm C.

Topic Weight Topic Terms
0.282 factors success information critical management implementation study factor successful systems support quality variables related results
0.225 research information systems science field discipline researchers principles practice core methods area reference relevance conclude
0.216 instrument measurement factor analysis measuring measures dimensions validity based instruments construct measure conceptualization sample reliability
0.215 information approach article mis presents doctoral dissertations analysis verification management requirements systems list needs including
0.208 technology organizational information organizations organization new work perspective innovation processes used technological understanding technologies transformation
0.204 competence experience versus individual disaster employees form npd concept context construct effectively focus functionalities front-end
0.196 response responses different survey questions results research activities respond benefits certain leads two-stage interactions study
0.151 results study research experiment experiments influence implications conducted laboratory field different indicate impact effectiveness future
0.145 managers managerial manager decisions study middle use important manager's appropriate importance context organizations indicate field
0.144 intelligence business discovery framework text knowledge new existing visualization based analyzing mining genetic algorithms related
0.125 planning strategic process management plan operational implementation critical used tactical effectiveness number identified activities years
0.121 adoption diffusion technology adopters innovation adopt process information potential innovations influence new characteristics early adopting
0.117 results study research information studies relationship size variables previous variable examining dependent increases empirical variance
0.102 mis problems article systems management edp managers organizations data survey application examines need experiences

Coauthors (number of co-authorships in parentheses): Huff, Sid L. (3); Compeau, Deborah R. (1); Marcolin, Barbara L. (1); Newsted, Peter R. (1); Wheeler, Basil R. (1)
Keywords: Competence (1); critical success factors (1); Empirical (1); End-User Computing (1); information analysis (1); Information requirements (1); information technology adoption (1); management control (1); organizational impact (1); planning (1); questionnaire development (1); research methodology (1); Self-Efficacy (1); Software Skills (1); systems analysis (1); Survey research (1); Theoretical Framework (1); Technology transfer (1)

Articles (4)

Assessing User Competence: Conceptualization and Measurement. (Information Systems Research, 2000)
Abstract:
    Organizations today face great pressure to maximize the benefits from their investments in information technology (IT). They are challenged not just to use IT, but to use it as effectively as possible. Understanding how to assess the competence of users is critical in maximizing the effectiveness of IT use. Yet the user competence construct is largely absent from prominent technology acceptance and fit models, poorly conceptualized, and inconsistently measured. We begin by presenting a conceptual model of the assessment of user competence to organize and clarify the diverse literature regarding what user competence means and the problems of assessment. As an illustrative study, we then report the findings from an experiment involving 66 participants. The experiment was conducted to compare empirically two methods (paper and pencil tests versus self-report questionnaire), across two different types of software, or domains of knowledge (word processing versus spreadsheet packages), and two different conceptualizations of competence (software knowledge versus self-efficacy). The analysis shows statistical significance in all three main effects. How user competence is measured, what is measured, what measurement context is employed: all influence the measurement outcome. Furthermore, significant interaction effects indicate that different combinations of measurement methods, conceptualization, and knowledge domains produce different results. The concept of frame of reference, and its anchoring effect on subjects' responses, explains a number of these findings. The study demonstrates the need for clarity in both defining what type of competence is being assessed and in drawing conclusions regarding competence, based upon the types of measures used. Since the results suggest that definition and measurement of the user competence construct can change the ability score being captured, the existing information system (IS) models of usage must contain the concept of an ability rating. We conclude by discussing how user competence can be incorporated into the Task-Technology Fit model, as well as additional theoretical and practical implications of our research.
Survey Instruments in Information Systems. (MIS Quarterly, 1998)
Abstract:
    Due to the popularity of survey research in information systems, we have launched a compilation of survey instruments and related information. This work started in 1988 as the disk-based Calgary Surveys Query System and has now been extended to the World Wide Web via a contribution of "living scholarship" to MISQ Discovery. This work includes actual IS survey instruments--either in full text or via links to the appropriate citations--as well as introductory information to help researchers get started with the survey methodology.
Information Technology Assessment and Adoption: A Field Study. (MIS Quarterly, 1985)
Abstract:
    This article presents the results of a field study examining the strategies and mechanisms used by major companies for identifying, assessing, and adopting new information technology. The principal finding is the identification of several generic models that reveal the driving forces for new technology adoption. The article also describes phases in the adoption process, organizational roles, and information gathering mechanisms. This new line of research in MIS parallels and builds upon technology transfer research and marketing studies in the area of organizational buying behavior. The purpose of this work is to assist organizations with the challenge of coping with rapidly changing information technology. This article reports on a study of the organizational processes involved in information technology assessment and adoption (ITAA). In particular, data from a series of field investigations are analyzed and a set of simple process models is proposed. These models constitute generalizations of the ITAA approaches taken by the organizations studied. The models describe the major approaches to ITAA -- both the activities themselves and the management of those activities -- as observed in the companies studied.
Planning, Critical Success Factors, and Management Information Requirements. (MIS Quarterly, 1980)
Abstract:
    Focusing on a manager's goals and critical success factors has been advocated as an approach to defining senior and middle managers' information requirements. In this article a field study is described in which the planning processes in a corporation were used as a mechanism for identifying goals, critical success factors, and performance measures and standards, i.e., information requirements for managerial control. A general approach generated from the field study is described and the advantages and disadvantages of the approach are analyzed.